VMC Optimization: SGA vs. fixed sample reweighting
In VMC, we parameterize the wave function and vary the parameters until the minimum of the energy (or of the variance) is reached. Fixed sample reweighting (FSR) generates a set of configurations from the Monte Carlo process, then uses a least-squares minimization algorithm, together with a reweighted estimate of the local energies, to minimize the variance of those energies over the fixed sample.
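As a rough illustration of one FSR cycle, here is a minimal sketch in Python. It assumes hypothetical user-supplied functions psi(params, config) and local_energy(params, config), a set of configurations configs already sampled from |psi|^2 at the starting parameters, and a fixed reference energy e_ref; the reweighted variance is expressed as a sum of squares and handed to an off-the-shelf least-squares routine.

import numpy as np
from scipy.optimize import least_squares

# Assumed (hypothetical) user-supplied pieces, not from the original post:
#   psi(params, config)          -> value of the trial wave function
#   local_energy(params, config) -> E_L = (H psi)/psi at that configuration
#   configs                      -> samples drawn once from |psi(params0, .)|^2

def reweighted_residuals(params, params0, configs, e_ref):
    # Reweighting factors correct for the fact that the fixed sample was
    # drawn at params0, not at the current params.
    w = np.array([(psi(params, c) / psi(params0, c)) ** 2 for c in configs])
    w /= w.sum()
    e_loc = np.array([local_energy(params, c) for c in configs])
    # Sum of squared residuals = reweighted variance of E_L about e_ref.
    return np.sqrt(w) * (e_loc - e_ref)

def fsr_step(params0, configs, e_ref):
    # One "coarse step": minimize the variance over the fixed sample,
    # then resample with the new parameters and repeat.
    result = least_squares(reweighted_residuals, params0,
                           args=(params0, configs, e_ref))
    return result.x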
The Stochastic Gradient Approximation (SGA) takes a noisy estimate of the gradient of the energy and moves in the downhill direction. The step size is scaled by a factor that decreases each iteration (usually as 1/n). It is guaranteed to converge, but it may take a while.
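A bare-bones sketch of the update loop, assuming a hypothetical energy_gradient(params) that returns a noisy Monte Carlo estimate of the energy gradient from a short VMC run; the 1/n step-size schedule is what the convergence guarantee relies on.

import numpy as np

# Assumed (hypothetical): energy_gradient(params) returns a noisy, unbiased
# Monte Carlo estimate of the gradient of the energy with respect to params.

def sga_optimize(params, n_steps=1000, a=0.1):
    params = np.asarray(params, dtype=float)
    for n in range(1, n_steps + 1):
        grad = energy_gradient(params)    # noisy gradient from a short MC run
        params = params - (a / n) * grad  # small step downhill, shrinking as 1/n
    return params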
I don't like the FSR method on aesthetic grounds. SGA feels more in line with the "Monte Carlo spirit". FSR feels like it is taking very coarse steps: sample configurations for a while, stop and do a minimization, then repeat these two steps for a few iterations. Each step is large and results in a significant improvement. SGA, on the other hand, takes a large number of small steps. Each individual step doesn't give much improvement (or any improvement at all), but the steps accumulate to give a good answer.
Aesthetic considerations aside, FSR requires no analytic gradients for an efficient implementation. SGA needs analytic gradients with respect to most of the parameters (you could use numerical gradients, but the cost gets much worse, as sketched below).
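To see why, here is a sketch of a central-difference gradient, assuming a hypothetical vmc_energy(params) that runs a short VMC calculation and returns a noisy energy estimate: every gradient evaluation needs two separate energy estimates per parameter, and the statistical noise in each is amplified by the 1/(2h) factor.

import numpy as np

# Assumed (hypothetical): vmc_energy(params) runs a short VMC calculation and
# returns a noisy estimate of the energy for the given parameters.

def numerical_gradient(params, h=1e-3):
    params = np.asarray(params, dtype=float)
    grad = np.empty_like(params)
    for i in range(len(params)):
        dp = np.zeros_like(params)
        dp[i] = h
        # Two VMC runs per parameter; their noise gets divided by 2h.
        grad[i] = (vmc_energy(params + dp) - vmc_energy(params - dp)) / (2 * h)
    return grad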
FSR does variance minimization while SGA does energy minimization; there may be some differences in the resulting quality of the wave functions (JCP 112, 4935).